ECE5725 Project
May 12, 2024
Xiangzhou Wei (xxw2) and Yanxing Zhang (yz2927)
Figure 1 shows the placement of the three ultrasonic sensors in our project, which let the robot turn at the correct angle when it detects an obstacle. We also incorporated an MPU-6050 gyroscope to keep the robot driving as straight as possible, and a PiCamera streams the robot's view, serving as a surveillance camera.
Figure 2 shows the hardware diagram for the robot. The three ultrasonic sensors measure distances and transmit them to the Raspberry Pi, while the gyroscope keeps the robot on a straight line: its yaw angle change feeds the motor PID algorithm, which adjusts the duty cycle of each motor's PWM output. We use the Adafruit MPU-6050 gyroscope, and the ultrasonic sensor is the RCWL-1601.
We use three ultrasonic sensors to detect barriers: one at the front and one each on the left and right. The sensors read distance data continuously and feed it to the decision logic that determines when to turn left or right. The IMU and ultrasonic sensor reads run in their own thread, separate from the motor control and PID algorithm, for data persistence and logic separation.
Figure 3 shows the software diagram for the PID control. The gyroscope's angle of rotation (yaw) serves as the error term of the PID algorithm and is fed back to adjust the PID's output. Each motor has its own PID controller, which keeps the car on a straight line.
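As a minimal sketch of one iteration of this loop (the PID class appears in the full listing at the end; read_yaw_rate() and set_motor_speeds() stand in for the real helpers there):

import time
BASE_SPEED = 70                      # nominal duty cycle for both wheels
pid = PID(Kp=5, Ki=0.7, Kd=0.01)     # PID class from the listing below
yaw = 0.0                            # integrated heading, in degrees
last_t = time.time()
while True:
    now = time.time()
    dt = now - last_t
    yaw += read_yaw_rate() * dt      # hypothetical helper: gyro yaw rate, deg/s
    correction = pid.compute(0.0 - yaw, dt)  # error = target heading (0) - yaw
    set_motor_speeds(BASE_SPEED - correction, BASE_SPEED + correction)
    last_t = now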
The barrier-avoidance software design diagram is shown in Figure 4. The ultrasonic sensors send distances to the Raspberry Pi. When the distance drops to about 5 cm, the car stops and stores its current direction, then turns left to avoid the barrier. If there is also a barrier on the left, it turns back and tries the right side instead; if the right is blocked as well, it stops. When a barrier is detected, a flag records which ultrasonic sensor saw it first, and the other sensors are ignored for the duration of the maneuver. This prevents a specific failure: if the front sensor detects a barrier and the car turns left, the right sensor now faces that barrier and would typically read under 5 cm, which would wrongly trigger a turn back in the other direction. All sensors keep measuring distance, but the one that detects the barrier first is prioritized in our Raspberry Pi program.
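A minimal sketch of this first-detection latch (names are illustrative; the actual decision-making lives in the main loop of the listing below):

THRESHOLD = 26        # cm; the lengthened threshold used in the final code
active_sensor = None  # latches whichever sensor detected the barrier first

def update_latch(distances):
    # Latch the first sensor to cross the threshold and ignore the others
    # until the avoidance maneuver finishes and clear_latch() is called.
    global active_sensor
    if active_sensor is None:
        for name in ('front', 'left', 'right'):
            if distances[name] <= THRESHOLD:
                active_sensor = name
                break
    return active_sensor

def clear_latch():
    global active_sensor
    active_sensor = None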
The robot can stream video along its barrier-avoidance path with the PiCamera. The stream runs in a separate thread so that it does not interfere with the barrier-avoidance function; it starts when the robot starts moving, ends when the barrier-avoidance program quits, and is viewed remotely in a browser. The Raspberry Pi is connected to the PiCamera, which captures video data. A Flask web server running on the same Raspberry Pi processes and streams this data, making the video available to client devices over the network: from the camera, through the Raspberry Pi and its web server, to the client's screen.
In the first week, we wrote a program to test whether the front ultrasonic sensor could work together with our Lab 3 design by making the robot stop when an obstacle was detected within 5 cm. One issue we discovered was that when the sensor's VCC was connected to the Raspberry Pi's 5 V pin, the piTFT would not display anything, so we disconnected the ultrasonic sensor's VCC before starting the system and reconnected it afterwards.
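The week-1 test amounted to roughly the following (a reconstruction, not the exact test script, using the get_distance() and stop_motors() helpers shown in the full listing):

try:
    while True:
        # Stop as soon as the front sensor reads an obstacle within 5 cm
        if get_distance(front_trigger_pin, front_echo_pin) <= 5:
            stop_motors()
            break
        time.sleep(0.05)
except KeyboardInterrupt:
    stop_motors()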
In the second week, we tested our PID program to enable the robot's self-balancing function. We added the PID control algorithm with Kp = -78.5, Ki = 1, and Kd = 1; the PID uses the gyroscope's Y-axis reading as its error term and feeds it back to compute the correct duty cycle. The ultrasonic sensor detects the distance and triggers the robot to turn right by 90 degrees, with the turning angle also recorded from the gyroscope. We extended the robot with one more piece of functionality using the PiCamera, testing it by capturing an image of the obstacle and saving it to our desktop.
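The camera test came down to a few lines with the picamera library (a sketch; the output path is illustrative, not the exact one we used):

import picamera
from time import sleep

with picamera.PiCamera() as camera:
    camera.resolution = (640, 480)
    sleep(2)  # give the sensor a moment to settle before capturing
    camera.capture('/home/pi/Desktop/obstacle.jpg')  # save the obstacle image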
In our last week, we further tuned the PID to make the robot go straight and implemented all three ultrasonic sensors so the robot could decide which way to turn based on their readings: turn right if there was an obstacle on the left, turn left if there was an obstacle on the right. We placed boxes on the ground and ran our program. Initially, we discovered that our threshold distance (the distance within which the robot should turn) was too short, so the robot would malfunction or not turn as expected; we therefore increased the threshold distance to optimize our design. The code in the listing below sets up a simple Flask web server to stream video from the Raspberry Pi Camera Module. Here's a concise explanation of how it works:
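The Flask app exposes one route whose response uses the multipart/x-mixed-replace MIME type, so the browser keeps the connection open and swaps in each new image as it arrives. The generator behind that response captures JPEG frames from the PiCamera with capture_continuous() and yields each one as a --frame part. The server itself runs in a background thread, so streaming never blocks the motor control loop.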
To test the video streaming Flask application on the Raspberry Pi, we first ran the script to initiate the Flask server. With the PiCamera connected, the server streamed JPEG images captured from the camera through the /video_feed endpoint. We then accessed the stream by opening http://<raspberry-pi-address>:5000/video_feed in a web browser on a device connected to the same network as the Raspberry Pi (or http://localhost:5000/video_feed on the Pi itself). This let us visually confirm that video from the PiCamera was being streamed live, demonstrating the functionality and responsiveness of our setup.
Eventually, we achieved all our objectives: the robot successfully avoids obstacles in all three directions and drives in a straight line, while live video from the PiCamera is streamed throughout its travel.
We also tried to have the robot avoid the obstacle and then return to its original direction. We eventually discarded this plan because accurately recalibrating the robot's orientation after an avoidance maneuver proved too complex for our sensor setup and control algorithms; the difficulty of maintaining precise directional control, together with frequent errors in the gyroscope data, led us to pursue simpler, more reliable navigation strategies.
The robot turns right when it detects a barrier in front, or in front and on the left, and turns left when there is a barrier in front and on the right. The streaming video is shown in the browser.
One further improvement would be to mount the ultrasonic sensors on a servo so they can sweep a wider range of directions and perform better when avoiding barriers. Another would be to add computer vision capabilities so the robot can identify what kind of obstacle it is facing. Moreover, we could integrate a pathfinding algorithm such as A* or Dijkstra to improve the robot's navigation, using the data from all three ultrasonic sensors to make more informed decisions about obstacle avoidance and route planning, leading to smoother and more efficient movement.
yz2927@cornell.edu
Designed the overall software architecture.
xxw2@cornell.edu
Implemented the PiCamera streaming and tested the overall system.
# ECE5725 Final Project
# Lab Section: Monday
# Team Members: Xiangzhou Wei (xxw2), Yanxing Zhang (yz2927)
# Date: 2024-05-12
import time
import threading
import RPi.GPIO as GPIO
import smbus
import io
from flask import Flask, Response
import picamera
# Flask app setup for video streaming
app = Flask(__name__)
video_thread = None
streaming_active = False
# Constants for MPU6050
PWR_MGMT_1 = 0x6B
SMPLRT_DIV = 0x19
CONFIG = 0x1A
GYRO_CONFIG = 0x1B
INT_ENABLE = 0x38
GYRO_ZOUT_H = 0x47
GYRO_SCALE_MODIFIER = 14
# GPIO setup
GPIO.setmode(GPIO.BCM)
GPIO.setwarnings(False)
left_pwm_pin = 26
right_pwm_pin = 16
left_dir_pin1 = 5
left_dir_pin2 = 6
right_dir_pin1 = 20
right_dir_pin2 = 21
front_trigger_pin = 17
front_echo_pin = 18
left_trigger_pin = 19
left_echo_pin = 13
right_trigger_pin = 12
right_echo_pin = 25
GPIO.setup([left_pwm_pin, right_pwm_pin, left_dir_pin1, left_dir_pin2, right_dir_pin1, right_dir_pin2], GPIO.OUT)
GPIO.setup([front_trigger_pin, left_trigger_pin, right_trigger_pin], GPIO.OUT)
GPIO.setup([front_echo_pin, left_echo_pin, right_echo_pin], GPIO.IN)
GPIO.setup(27, GPIO.IN, pull_up_down=GPIO.PUD_UP)
left_pwm = GPIO.PWM(left_pwm_pin, 50)
right_pwm = GPIO.PWM(right_pwm_pin, 50)
left_pwm.start(0)
right_pwm.start(0)
# Initialize SMBus for MPU6050
bus = smbus.SMBus(1)
Device_Address = 0x68
def MPU_Init():
    # Wake the MPU-6050 and configure sample rate, gyro range, and interrupts
    bus.write_byte_data(Device_Address, SMPLRT_DIV, 7)
    bus.write_byte_data(Device_Address, PWR_MGMT_1, 1)
    bus.write_byte_data(Device_Address, CONFIG, 0)
    bus.write_byte_data(Device_Address, GYRO_CONFIG, 24)
    bus.write_byte_data(Device_Address, INT_ENABLE, 1)

def read_raw_data(addr):
    # Read a signed 16-bit value from two consecutive registers
    high = bus.read_byte_data(Device_Address, addr)
    low = bus.read_byte_data(Device_Address, addr + 1)
    value = (high << 8) | low
    if value >= 32768:  # convert from two's complement
        value -= 65536
    return value
MPU_Init()
left_init_speed = 70
right_init_speed = 70
distance_threshold = 26
# Flask route for video streaming
@app.route('/video_feed')
def video_feed():
    global streaming_active
    streaming_active = True
    return Response(generate_frames(), mimetype='multipart/x-mixed-replace; boundary=frame')

def generate_frames():
    with picamera.PiCamera() as camera:
        camera.resolution = (640, 480)
        camera.framerate = 24
        stream = io.BytesIO()
        for _ in camera.capture_continuous(stream, 'jpeg', use_video_port=True):
            if not streaming_active:
                break
            stream.seek(0)
            yield b'--frame\r\nContent-Type: image/jpeg\r\n\r\n' + stream.read() + b'\r\n'
            stream.seek(0)
            stream.truncate()

def start_video_stream():
    global video_thread
    video_thread = threading.Thread(target=lambda: app.run(host='0.0.0.0', port=5000, threaded=True))
    video_thread.daemon = True  # app.run() never returns, so let the thread die with the main program
    video_thread.start()

def stop_video_stream():
    global streaming_active
    streaming_active = False  # the frame generator sees this flag and stops yielding
def set_motor_speeds(left_speed, right_speed):
    clamped_left_speed = max(0, min(100, left_speed))
    clamped_right_speed = max(0, min(100, right_speed))
    GPIO.output(left_dir_pin1, 1)
    GPIO.output(left_dir_pin2, 0)
    GPIO.output(right_dir_pin1, 0)
    GPIO.output(right_dir_pin2, 1)
    left_pwm.ChangeDutyCycle(clamped_left_speed)
    right_pwm.ChangeDutyCycle(clamped_right_speed)

def wait_for_button_press():
    print("Waiting for button press on GPIO 27...")
    while True:
        if GPIO.input(27) == GPIO.LOW:
            print("Button pressed, starting the robot.")
            break
        time.sleep(0.1)

def stop_motors():
    left_pwm.ChangeDutyCycle(0)
    right_pwm.ChangeDutyCycle(0)
def get_distance(trigger_pin, echo_pin):
    # Fire a short trigger pulse, then time the width of the echo pulse
    GPIO.output(trigger_pin, True)
    time.sleep(0.0001)
    GPIO.output(trigger_pin, False)
    start = time.time()
    while GPIO.input(echo_pin) == 0:
        start = time.time()
    stop = time.time()
    while GPIO.input(echo_pin) == 1:
        stop = time.time()
    elapsed = stop - start
    dist = (elapsed * 34300) / 2  # cm: speed of sound 343 m/s, halve the round trip
    return dist
# Shared variables to hold distance measurements
distance_data = {'front': float('inf'), 'left': float('inf'), 'right': float('inf')}
def read_ultrasonic_sensors():
    while True:
        distance_data['front'] = get_distance(front_trigger_pin, front_echo_pin)
        distance_data['left'] = get_distance(left_trigger_pin, left_echo_pin)
        distance_data['right'] = get_distance(right_trigger_pin, right_echo_pin)
        time.sleep(0.1)  # Reduce the load on the CPU
def execute_turn_left(turn_angle):
    current_angle = 0
    last_t = time.time()
    set_motor_speeds(0, right_init_speed)  # pivot: drive only the right wheel
    while abs(current_angle) < abs(turn_angle):
        current_t = time.time()
        dt = current_t - last_t
        gyro_z = read_raw_data(GYRO_ZOUT_H) / GYRO_SCALE_MODIFIER
        current_angle += gyro_z * dt  # integrate yaw rate into the turned angle
        print(current_angle)
        last_t = current_t
    stop_motors()

def execute_turn_right(turn_angle):
    current_angle = 0
    last_t = time.time()
    set_motor_speeds(left_init_speed, 0)  # pivot: drive only the left wheel
    while abs(current_angle) < abs(turn_angle):
        current_t = time.time()
        dt = current_t - last_t
        gyro_z = read_raw_data(GYRO_ZOUT_H) / GYRO_SCALE_MODIFIER
        current_angle += gyro_z * dt
        print(current_angle)
        last_t = current_t
    stop_motors()
class PID:
    def __init__(self, Kp, Ki, Kd):
        self.Kp = Kp
        self.Ki = Ki
        self.Kd = Kd
        self.previous_error = 0
        self.integral = 0

    def compute(self, error, dt):
        self.integral += error * dt
        derivative = (error - self.previous_error) / dt
        output = self.Kp * error + self.Ki * self.integral + self.Kd * derivative
        self.previous_error = error
        return output
def move_forward():
    pid = PID(Kp=5, Ki=0.7, Kd=0.01)
    target_angle = 0
    last_t = time.time()
    while True:
        if distance_data['front'] <= distance_threshold:
            stop_motors()  # obstacle ahead: hand control back to the main loop
            break
        current_t = time.time()
        dt = current_t - last_t
        gyro_z = read_raw_data(GYRO_ZOUT_H) / GYRO_SCALE_MODIFIER
        angle_error = target_angle - gyro_z * dt  # yaw change this iteration is the error
        correction = pid.compute(angle_error, dt)
        left_speed = left_init_speed - correction
        right_speed = right_init_speed + correction
        set_motor_speeds(left_speed, right_speed)
        last_t = current_t
        time.sleep(0.1)
# Start the ultrasonic sensor thread
ultrasonic_thread = threading.Thread(target=read_ultrasonic_sensors)
ultrasonic_thread.daemon = True # Terminate with the main program
ultrasonic_thread.start()
# Main loop
try:
    wait_for_button_press()
    start_video_stream()  # Start video streaming when the robot starts
    while True:
        front_distance = distance_data['front']
        left_distance = distance_data['left']
        right_distance = distance_data['right']
        if front_distance <= distance_threshold and left_distance <= distance_threshold and right_distance <= distance_threshold:
            stop_motors()
        elif front_distance <= distance_threshold and left_distance <= distance_threshold:
            execute_turn_right(90)
        elif front_distance <= distance_threshold and right_distance <= distance_threshold:
            execute_turn_left(90)
        elif front_distance <= distance_threshold:
            execute_turn_right(90)
        else:
            move_forward()
except KeyboardInterrupt:
    stop_motors()
    GPIO.cleanup()
    stop_video_stream()  # Stop the video streaming when the robot stops
    print("Test stopped by user.")